Overdetermined independent vector analysis
We address the convolutive blind source separation problem for the
(over-)determined case where (i) the number K of nonstationary target sources
is less than the number M of microphones, and (ii) there are up to M − K
stationary Gaussian noises that do not need to be extracted. Independent vector
analysis (IVA) can solve the problem by separating the mixture into M sources and
selecting the K most nonstationary signals among them, but this
approach wastes computation, especially when K ≪ M. Reducing the number of
channels in a preprocessing step for IVA, e.g., by principal component analysis,
risks removing the target signals. We here extend IVA to resolve these
issues. One such extension imposes the orthogonality
constraint (OC) that the sample correlation between the target and noise
signals be zero. The proposed IVA, on the other hand, does not rely on OC
and exploits only the independence between sources and the stationarity of the
noises. This enables us to develop several efficient algorithms based on block
coordinate descent methods with a problem-specific acceleration. We clarify
that one such algorithm exactly coincides with the conventional IVA with OC,
and explain why the other newly developed algorithms are faster than it.
Experimental results show the reduced computational load of the new algorithms
compared to the conventional methods. In particular, a new algorithm
specialized for K = 1 outperforms the others.
Comment: To appear at the 45th International Conference on Acoustics, Speech,
and Signal Processing (ICASSP 2020).
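The selection step described in the abstract above (separate into M signals, then keep the K most nonstationary ones) can be sketched as follows. The frame-wise log-power-variance score used here is an illustrative choice of nonstationarity measure, not the criterion from the paper, and all names are hypothetical.

```python
import numpy as np

def rank_by_nonstationarity(signals, frame_len=512):
    """Rank separated signals by a simple nonstationarity score.

    signals: array of shape (M, T) holding M separated time signals.
    Score: variance of frame-wise log-power; a stationary Gaussian noise
    yields a low score, a nonstationary (e.g., speech-like) signal a high one.
    """
    M, T = signals.shape
    n_frames = T // frame_len
    frames = signals[:, :n_frames * frame_len].reshape(M, n_frames, frame_len)
    log_power = np.log(np.mean(frames ** 2, axis=2) + 1e-12)  # (M, n_frames)
    scores = np.var(log_power, axis=1)                        # (M,)
    return np.argsort(scores)[::-1]  # indices, most nonstationary first

# Example: one amplitude-modulated "speech-like" signal among two
# stationary Gaussian noises; the modulated signal should rank first.
rng = np.random.default_rng(0)
T = 16384
t = np.arange(T)
sources = np.vstack([
    rng.standard_normal(T) * (1 + np.sin(2 * np.pi * t / 4000)) ** 2,
    rng.standard_normal(T),
    rng.standard_normal(T),
])
order = rank_by_nonstationarity(sources)
print(order[0])  # index of the most nonstationary signal
```

When K ≪ M, the point of the paper is precisely that computing all M separated signals before this selection is wasteful, which the proposed extension avoids.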
NoisyILRMA: Diffuse-Noise-Aware Independent Low-Rank Matrix Analysis for Fast Blind Source Extraction
In this paper, we address the multichannel blind source extraction (BSE) of a
single source in diffuse noise environments. To solve this problem even faster
than by fast multichannel nonnegative matrix factorization (FastMNMF) and its
variant, we propose a BSE method called NoisyILRMA, which is a modification of
independent low-rank matrix analysis (ILRMA) to account for diffuse noise.
NoisyILRMA achieves considerably faster BSE by incorporating an algorithm
developed for independent vector extraction. In addition, to improve the BSE
performance of NoisyILRMA, we propose a mechanism that switches from an
ILRMA-like nonnegative matrix factorization source model to a more expressive
source model during optimization. In the experiment, we show that NoisyILRMA runs
faster than a FastMNMF algorithm while maintaining the BSE performance. We also
confirm that the switching mechanism improves the BSE performance of
NoisyILRMA.
Comment: 5 pages, 3 figures, accepted for the European Signal Processing
Conference 2023 (EUSIPCO 2023).
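The ILRMA-like source model mentioned above factorizes each source's power spectrogram as a low-rank nonnegative product under the Itakura-Saito divergence. A minimal sketch of the standard multiplicative updates for such an IS-NMF model (a textbook reconstruction with hypothetical names, not the NoisyILRMA implementation) is:

```python
import numpy as np

def is_nmf(P, n_basis=2, n_iter=100, eps=1e-12, seed=0):
    """Itakura-Saito NMF: approximate a power spectrogram P (F x N)
    by T @ V with T (F x K), V (K x N) nonnegative, using the
    standard multiplicative updates with exponent 1/2."""
    rng = np.random.default_rng(seed)
    n_freq, n_frames = P.shape
    T = rng.random((n_freq, n_basis)) + eps
    V = rng.random((n_basis, n_frames)) + eps
    for _ in range(n_iter):
        R = T @ V + eps                                   # current model
        T *= np.sqrt((P / R ** 2) @ V.T / ((1 / R) @ V.T))
        R = T @ V + eps
        V *= np.sqrt(T.T @ (P / R ** 2) / (T.T @ (1 / R)))
    return T, V

def is_div(P, R, eps=1e-12):
    """Itakura-Saito divergence between spectrogram P and model R."""
    Q = (P + eps) / (R + eps)
    return float(np.sum(Q - np.log(Q) - 1))

rng = np.random.default_rng(1)
P = rng.random((8, 20)) + 0.1
T1, V1 = is_nmf(P, n_iter=1)
T2, V2 = is_nmf(P, n_iter=50)
print(is_div(P, T2 @ V2) <= is_div(P, T1 @ V1))  # more iterations reduce the fit error
```

The proposed switching mechanism replaces such a low-rank model with a more expressive one during optimization; only the low-rank starting point is sketched here.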
ISS2: An Extension of Iterative Source Steering Algorithm for Majorization-Minimization-Based Independent Vector Analysis
A majorization-minimization (MM) algorithm for independent vector analysis
optimizes a separation matrix W by minimizing a surrogate function of the form
g(W) = Σ_{i=1}^{M} w_i^h V_i w_i − log |det W|^2, where M is
the number of sensors, w_i^h denotes the i-th row of W, and positive definite
matrices V_1, …, V_M are constructed in each MM iteration. For M ≥ 2,
no algorithm has been found that obtains a global minimum of g.
Instead, block coordinate descent (BCD) methods with closed-form update
formulas have been developed for minimizing g and shown to be
effective. One such BCD, called iterative projection (IP), updates one or
two rows of W in each iteration. Another BCD, called iterative source
steering (ISS), updates one column of the mixing matrix A = W^{−1} in
each iteration. Although the time complexity per iteration of ISS is M times
smaller than that of IP, the conventional ISS converges more slowly than the current
fastest IP (called IP-2) that updates two rows of W in each
iteration. We here extend this ISS to ISS2, which can update two
columns of A in each iteration while maintaining its small time complexity.
To this end, we provide a unified way of developing new ISS-type methods from
which ISS2 as well as the conventional ISS can be immediately
obtained in a systematic manner. Numerical experiments on separating reverberant
speech mixtures show that our ISS2 converges in fewer MM iterations
than the conventional ISS, and is comparable to IP-2.
Comment: Accepted for publication in the 30th European Signal Processing
Conference (EUSIPCO 2022).
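For the real-valued case, one sweep of the conventional ISS rank-1 updates described in the abstract above can be sketched as follows. The closed-form coefficients follow the published ISS update rule; treat this as an illustrative reconstruction under that assumption, not the paper's exact algorithm.

```python
import numpy as np

def surrogate(W, Vs):
    """g(W) = sum_i w_i V_i w_i^T - log |det W|^2 (real-valued case)."""
    quad = sum(W[i] @ Vs[i] @ W[i] for i in range(len(Vs)))
    return quad - 2.0 * np.log(abs(np.linalg.det(W)))

def iss_sweep(W, Vs):
    """One sweep of iterative source steering: for each s, apply the
    rank-1 update W <- W - outer(v, w_s) with the closed-form optimal v.
    The objective separates over the entries of v, since
    det(W - outer(v, w_s)) = (1 - v_s) det(W)."""
    M = W.shape[0]
    for s in range(M):
        ws = W[s].copy()
        v = np.empty(M)
        for k in range(M):
            den = ws @ Vs[k] @ ws
            if k != s:
                v[k] = (W[k] @ Vs[k] @ ws) / den  # minimizer for row k
            else:
                v[k] = 1.0 - den ** -0.5          # minimizer for row s
        W = W - np.outer(v, ws)
    return W

# Each sweep performs exact coordinate minimization, so the surrogate
# cannot increase.
rng = np.random.default_rng(0)
M = 4
Vs = []
for _ in range(M):
    A = rng.standard_normal((M, M))
    Vs.append(A @ A.T + M * np.eye(M))            # positive definite
W = np.eye(M) + 0.1 * rng.standard_normal((M, M))
g0 = surrogate(W, Vs)
W = iss_sweep(W, Vs)
g1 = surrogate(W, Vs)
print(g1 <= g0 + 1e-9)
```

Each rank-1 update costs O(M^2) rather than the O(M^3) of an IP row update, which is the M-fold per-iteration saving the abstract refers to; ISS2 keeps this cost while updating two columns of A at once.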
Count matroids of group-labeled graphs
A graph G = (V, E) is called (k, ℓ)-sparse if |F| ≤ k|V(F)| − ℓ for any nonempty F ⊆ E, where V(F) denotes the set of vertices incident to F. It is known that the family of the edge sets of (k, ℓ)-sparse subgraphs forms the family of independent sets of a matroid, called the (k, ℓ)-count matroid of G. In this paper we shall investigate lifts of the (k, ℓ)-count matroids by using group labelings on the edge set. By introducing a new notion called near-balancedness, we shall identify a new class of matroids whose independence condition is described as a count condition of the form |F| ≤ k|V(F)| − ℓ + α_ψ(F) for some function α_ψ determined by a given group labeling ψ on E.
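The (k, ℓ)-sparsity condition above can be checked directly on small graphs by brute force over nonempty edge subsets (exponential time, for illustration only):

```python
from itertools import combinations

def is_sparse(edges, k, l):
    """Check (k, l)-sparsity: |F| <= k*|V(F)| - l for every nonempty F."""
    for r in range(1, len(edges) + 1):
        for F in combinations(edges, r):
            VF = {v for e in F for v in e}  # vertices incident to F
            if len(F) > k * len(VF) - l:
                return False
    return True

tree = [(0, 1), (1, 2), (1, 3)]          # a tree (hence a forest)
triangle = [(0, 1), (1, 2), (2, 0)]      # a cycle

print(is_sparse(tree, 1, 1))       # forests are exactly the (1, 1)-sparse graphs
print(is_sparse(triangle, 1, 1))   # the full cycle violates |F| <= |V(F)| - 1
print(is_sparse(triangle, 1, 0))   # but satisfies |F| <= |V(F)|
```

The matroid structure is what makes this tractable in practice: independence of the edge sets satisfying the count condition can be tested by matroid algorithms instead of subset enumeration.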